RPi MIDI Controller
May 20th 2020
Mira Kim (mk864), Jonathan Gao (jg992)
Objective
A MIDI Controller and Synthesizer with 64 buttons and looping function
Introduction
Music production has evolved from pure acoustic instruments to digital and synthesized instruments. Nowadays, nearly every physical instrument can be emulated using software. With modern audio processing, music production is easier than ever due to faster processors and better digital interfaces.
The predominant digital music standard is MIDI, or Musical Instrument Digital Interface. This standard describes how to encode musical information such as pitch and volume. Using this protocol, we can encode a song as a sequence of notes instead of an audio recording. This enables melodies and songs to be recorded and manipulated much as we would when writing music on a staff. With a device that can output these MIDI signals, we can effectively replace any musical instrument. This is what we aim to accomplish in this project.
While many MIDI controllers exist on the market, they are mostly proprietary hardware and are expensive due to the niche market. Many electric keyboards support MIDI output, but they are large and bulky. Other devices are smaller but come at a premium and rely on proprietary software. We aim to create a flexible, cheap, and simple platform using the Raspberry Pi.
This project consisted primarily of two parts: the MIDI controller hardware and the looper software. The MIDI controller part consists of a button board with 64 buttons connected to an RPi Zero, and the MIDI synthesizer part is an RPi B+ connected to a speaker. The MIDI controller outputs MIDI signals when buttons are pressed, and the MIDI synthesizer takes those signals and outputs sound. The synthesizer is also capable of recording a sequence of notes and replaying the recorded sequence in a loop while still responding to new button presses at the same time.
Overall system diagram
Design and Testing
MIDI Controller
Design Inspirations
Initial design considerations began with prototypes and guides found online. Many of these projects were aimed at accomplishing similar goals, using an embedded platform to build a MIDI device. A few of the sites we found used a similar design: a matrix of buttons laid out on a protoboard [1]–[3]. Of the three, SonnikimEngineering had the best documentation of how he designed his board [3], which gave us inspiration for how to begin designing ours.
First, we tried to use an Arduino Uno for button input processing. We planned to use the GPIO pins on the Arduino and encode MIDI output messages with a simple mapping algorithm. SonnikimEngineering uses a Teensy, a dev board similar to the Arduino, which inspired our initial choice of an Arduino Uno [3]. However, further research showed that the Arduino Uno lacks a USB interface chip that properly supports USB gadget configuration. While a workaround was possible, it required a separate programming header, as the USB port would no longer be usable for programming [4].
Because we had an extra Raspberry Pi Zero, we tried using it as a replacement for the Arduino. While the Arduino is a real-time system, the Raspberry Pi Zero runs Raspbian, and we were initially hesitant to use it because we feared the non-real-time kernel would lead to troublesome latency and timing issues. However, the RPi Zero does have a USB interface chip that supports USB gadget configuration, meaning it can be configured as an OTG gadget that emulates a USB device such as MIDI. This worked out very well, as development in Python is far easier than the Arduino's C/C++ dialect, and the RPi has significantly higher clock speeds and more memory.
Using a guide found online detailing USB OTG gadget configuration and the mido library, we were able to configure the RPi Zero as a MIDI device [5].
The RPi Zero posed some problems initially. Without a Wifi chip, and without the ability to hold an SSH connection over USB Ethernet while simultaneously acting as a USB MIDI gadget, we had no easy way to program the RPi while testing it as a MIDI device. To do so, we had to reconfigure the USB interface chip each time, switching between USB Ethernet and USB MIDI.
Our professor ended up having extra Raspberry Pi Zero W's lying around, which let us circumvent these issues by SSHing over Wifi instead.
Button Matrix
Our design is similar to the previously mentioned prototypes: 64 buttons laid out on a protoboard, each mapping to a MIDI note. Beyond that point, however, our design deviates from the others. While the other designs use several muxes to give every button its own GPIO pin, we use a row-column scanning pattern that indexes all 64 buttons with only 16 pins.
A schematic of a 2x2 button configuration is shown below. With this configuration, each button can be checked individually using far fewer pins.
Schematic of a 2x2 button circuit allowing individually indexed buttons.
Such a design works by exploiting current flow between source and sink pins. In our design, the row pins act as current sources and the column pins as current sinks: the row pins are configured as GPIO outputs driven LOW, while the column pins are configured as GPIO inputs with pull-down resistors, which limit the current and give us a defined voltage level to read. To check each button, we pull one row pin HIGH and read every column pin; if a column pin reads HIGH, the button at that row-column intersection is pressed. The process then repeats for each row.
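As a rough illustration of this pin configuration, the sketch below shows how the row and column pins can be set up with RPi.GPIO; the BCM pin numbers here are placeholders, not our actual wiring.
import RPi.GPIO as GPIO

xpins = [2, 3, 4, 17, 27, 22, 10, 9]    # row pins (current sources); placeholder BCM numbers
ypins = [11, 5, 6, 13, 19, 26, 21, 20]  # column pins (current sinks); placeholder BCM numbers

GPIO.setmode(GPIO.BCM)
for pin in xpins:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)           # rows idle LOW until scanned
for pin in ypins:
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)  # columns read LOW unless a pressed button connects them to the active row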
This design naturally lends itself to polling, as it requires iterating through every row and column individually. With regard to the earlier concerns about latency and timing, the RPi polls orders of magnitude faster than any latency we would care about. With RPi GPIO switching speeds reaching the MHz range, there were no timing issues, even after adding 5 µs delays for debouncing [6]; in fact, debouncing turned out to be unnecessary and did not change the behavior of the buttons.
Initially, we started with a 2x2 matrix for testing. The first prototype did not include diodes, so some of the row pins sank current when certain buttons were pressed simultaneously. To prevent this, we added diodes (the red LEDs in our parts list) so that current can only flow from the row pins to the column pins. In principle this would also let the LEDs light up on button presses, but the 3.3 V output from the RPi Zero does not supply enough current through the pull-down resistors; even with smaller pull-down resistors, the LEDs are not bright. This could possibly be resolved by using 3.3 V LEDs, but the extra current draw might then become a problem.
Afterwards, we scaled up to an 8x8 board and soldered everything together on a protoboard.
Programming
We used the mido library, as shown in the guide detailing USB OTG MIDI configuration, for MIDI input and output [5]. To output MIDI, we first have to create a MIDI output port object as shown below.
outport = mido.open_output('f_midi')
Then, we construct MIDI messages based on note and velocity. To do this, we need a mapping from each button to a MIDI note. Each button on the matrix gets an ID equal to row ID * WIDTH + column ID, so going up the rows the IDs increase from 0 to 63. An array of 64 elements holds one note value per button, and on each button press we output the MIDI note stored at that button's index, as shown below.
chromaticnotemapping = [36,37,38,39,40,41,42,43,41,42,43,44,45,46,47,48,46,47,48,49,50,51,52,53,51,52,53,54,55,56,57,58,56,57,58,59,60,61,62,63,61,62,63,64,65,66,67,68,66,67,68,69,70,71,72,73,71,72,73,74,75,76,77,78]
mapping = chromaticnotemapping   # the currently selected note mapping
onmessages = [mido.Message('note_on', note = i, velocity = 127) for i in mapping]
offmessages = [mido.Message('note_off', note = i, velocity = 127) for i in mapping]
Then, to check for button presses, we iterate through each button as below.
while 1:
    for i in range(WIDTH):
        GPIO.output(xpins[i], 1)                  # drive this row HIGH
        for j in range(HEIGHT):
            currentval = GPIO.input(ypins[j])     # read each column
            if currentval != values[i][j]:        # state changed: rising or falling edge
                if currentval:
                    outport.send(onmessages[i*WIDTH + j])
                else:
                    outport.send(offmessages[i*WIDTH + j])
                values[i][j] = currentval
        GPIO.output(xpins[i], 0)                  # release the row before scanning the next one
We iterate through the rows, setting each row pin HIGH in turn, and for that row we read the voltage on every column pin; if a column reads HIGH, the corresponding button is pressed. Instead of sending MIDI messages on every loop iteration, we only send on rising and falling edges. The code above achieves this with a buffer array that stores the previously read values: when a value changes, we send a note-on or note-off message depending on the direction of the edge.
This was tested by plugging the RPi into a computer and reading the MIDI input over USB with multiple applications, such as FL Studio.
Additional Functions
Some additional functions were implemented after the core MIDI functionality was working. First, multiple note mappings were created: the chromatic mapping shown earlier, a drum mapping, and a scale mapping. The drum mapping and the scale-mapping generation are shown below.
drummapping = [36,37,38,39,68,69,70,71,40,41,42,43,72,73,74,75,44,45,46,47,76,77,78,79,48,49,50,51,80,81,82,83,52,53,54,55,84,85,86,87,56,57,58,59,88,89,90,91,60,61,62,63,92,93,94,95,64,65,66,67,96,97,98,99]
octavebasenote = 36                           # root note of the scale (illustrative value); changing it changes the key
octavemappingoffset = [0,2,4,5,7,9,11,12]     # major-scale intervals within one octave
octavemapping = [0] * (WIDTH * HEIGHT)
for i in range(HEIGHT):
    for j in range(WIDTH):
        octavemapping[i*WIDTH + j] = octavemappingoffset[j] + octavebasenote + i*12
For the scale mapping, the key of the scale is chosen with the octavebasenote variable. In the future, a function button could change this value to transpose the board.
Using some spare buttons, five function buttons were added to the board. Instead of polling these buttons, we registered callback functions that run like interrupt service routines on each button press. The functions include a shutdown command and a mapping switcher.
More functions can be built into the buttons using a shift button. This shift button changes a state variable in the program that activates alternative functions on button press. The details can be seen in the code.
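A minimal sketch of how such callback-driven function buttons can be registered with RPi.GPIO is shown below; the pin numbers are placeholders for illustration, not our exact wiring.
import RPi.GPIO as GPIO

MAP_PIN = 23         # placeholder BCM pin for the "switch mapping" button
SHIFT_PIN = 24       # placeholder BCM pin for the shift button
current_mapping = 0  # index into the note mappings defined earlier
shift = False        # state variable toggled by the shift button

def switch_mapping(channel):
    global current_mapping
    current_mapping = (current_mapping + 1) % 3   # cycle chromatic -> drum -> scale

def toggle_shift(channel):
    global shift
    shift = not shift                             # alternative functions check this flag

GPIO.setmode(GPIO.BCM)
GPIO.setup(MAP_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.setup(SHIFT_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
GPIO.add_event_detect(MAP_PIN, GPIO.RISING, callback=switch_mapping, bouncetime=300)
GPIO.add_event_detect(SHIFT_PIN, GPIO.RISING, callback=toggle_shift, bouncetime=300)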
MIDI Synthesizer
The goal of the MIDI synthesizer part was to generate sound from a MIDI signal and to record a sequence of notes that can be replayed in a loop while still responding to new button presses. Testing for this part was first done separately in a Windows environment, and then with the RPi at the end once the code was complete. All code was written with Pygame, since we had used Pygame in past labs and it supports both reading MIDI signals and playing sound.
Isolated Testing
Testing MIDI Signal Input
First, the Pi should be able to read MIDI signals correctly. This was tested with a sample script, MidiIntest.py, which reads pygame MIDI events (checking for them with poll()) and prints the note being played [7]. Fortunately, we had a keyboard piano available to connect to a laptop and verify that MIDI events were read correctly by this program.
For pygame.midi to work properly, it needs to be initialized, and an input port must be opened with pygame.midi.Input(<port number>). In this code, we detect the default MIDI input device's port number and open it as the input. Saving the returned object to a variable lets us call poll() and read() on it: poll() returns True when an event is waiting to be read, which makes detecting input simple without blocking. A typical MIDI event looks like [[[status, data1, data2, data3], timestamp], ...], where data1 is the note being played and data2 is the note's velocity. A typical MIDI controller sends one message when a note turns on and another when it turns off. For example, when a key on a keyboard piano is pressed, a MIDI message with the note number and a nonzero velocity is sent out; when the key is released, another message with the same note number and zero velocity is sent to signify that the note has ended.
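A minimal sketch of this kind of input loop (our exact MidiIntest.py is in the code appendix) looks roughly like this:
import pygame.midi

pygame.midi.init()
device_id = pygame.midi.get_default_input_id()
midi_in = pygame.midi.Input(device_id)

while True:
    if midi_in.poll():                      # True when at least one event is waiting
        for event in midi_in.read(10):      # each event is [[status, data1, data2, data3], timestamp]
            (status, note, velocity, _), timestamp = event
            print("note", note, "velocity", velocity)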
Playing Sound with Pygame.Midi
Next, we test that we can play sound through pygame.midi. This was done with a simple script, miditest.py, which plays three notes in a row. The setup is the same as for MIDI input, except that we open a pygame.midi.Output on the output port and set the instrument to the default with player.set_instrument(0). The note_on and note_off functions turn a specific note on and off, and time.sleep provides a delay between them so the note can be heard. This worked without much trouble in the Windows environment. We later found out that the output of this pygame function is not an audio signal but MIDI data that requires a soundfont file to render, so it did not work on the RPi immediately; Windows maps a default soundfont file and is able to play the sound. This is discussed further in the Testing With Hardware section.
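A rough sketch of the miditest.py idea, assuming the default output device, is shown below:
import time
import pygame.midi

pygame.midi.init()
player = pygame.midi.Output(pygame.midi.get_default_output_id())
player.set_instrument(0)          # default instrument

for note in (60, 64, 67):         # play three notes in a row
    player.note_on(note, 127)
    time.sleep(0.5)               # delay so the note is audible
    player.note_off(note, 127)

del player                        # release the output before quitting pygame.midi
pygame.midi.quit()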
Saving Notes to .mid File
For the synthesizer's recording function, we decided to store the notes in a .mid file rather than storing the MIDI events themselves. There is no good way to buffer pygame events and hold them until the play button is pressed, and event scheduling gets complicated and failure-prone, especially when newly pressed notes must play on top of the replay. It is much easier to record the sequence of MIDI signals into a .mid file and replay that file with pygame.mixer while pygame.midi handles the live button presses.
This part uses MIDIUtil, a Python library that lets you write MIDI files from Python programs [8]. Because it is fairly simple, it was not tested separately and was implemented in the main code right away. Writing a note to a MIDI file with MIDIUtil requires the note to be played, its start time, and its duration. How this is used in the main code is discussed in the "Putting Everything Together" section.
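As a rough sketch (the exact import and call style depend on the MIDIUtil version), writing a single note to a .mid file looks like this:
from midiutil import MIDIFile     # older versions use: from midiutil.MidiFile import MIDIFile

midifile = MIDIFile(1)            # one track
midifile.addTempo(0, 0, 60)       # 60 BPM on track 0 so one beat corresponds to one second
# addNote(track, channel, pitch, time, duration, volume); time and duration are in beats
midifile.addNote(0, 0, 60, 0, 1, 100)

with open("output.mid", "wb") as f:
    midifile.writeFile(f)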
Playing a .mid with Pygame.Mixer
As discussed above, replay is handled by pygame.mixer rather than pygame.midi. This part was tested with a modified version of a sample script, MIDIplay.py [9]. pygame.mixer is initialized with a chosen frequency, bit size, channel count, and buffer size, and the music file is loaded. We then call play(), and while the mixer is busy we tick() the pygame clock, waiting until the entire file has finished playing.
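A sketch of this playback test, with commonly used mixer parameters standing in for our exact values, is shown below:
import pygame

pygame.mixer.init(frequency=44100, size=-16, channels=2, buffer=1024)
pygame.mixer.music.load("output.mid")
pygame.mixer.music.play()

clock = pygame.time.Clock()
while pygame.mixer.music.get_busy():   # wait until the whole file has finished playing
    clock.tick(30)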
During this process we ran into problems with Python: it kept raising a Timidity error saying it could not locate the music file even though the file path was set correctly. After many hours of debugging, reinstalling Python from scratch resolved the issue. We assume it was some Python version problem, even though the originally installed version was also Python 2.
Putting Everything Together
With all of the individual tests working, we put everything together into a main program to run on the RPi B+. This is pi3main.py, which combines everything mentioned above and adds a pygame screen with record and play buttons.
There is some basic setup code at the top. One thing to note is that pygame.midi needs to be initialized before pygame.init, or else pygame will throw an error.
After some more variable setup, the code starts with a for loop that lists all the input and output ports so the user can select which port to use for MIDI input. This code was adapted from a sample program that maps wav files to MIDI notes [10]. The console shows the input and output devices with their port numbers, and the user types the number of the port they want to use. This is especially useful if multiple MIDI devices are connected or if the program has trouble figuring out which input port to use. The MIDI input port is then opened with this information, as sketched below.
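A sketch of this port-selection step (shown in Python 3 syntax; variable names are illustrative) follows:
import pygame.midi

pygame.midi.init()
for i in range(pygame.midi.get_count()):
    interf, name, is_input, is_output, opened = pygame.midi.get_device_info(i)
    print(i, name, "(input)" if is_input else "(output)")

port = int(input("MIDI input port to use: "))
midi_in = pygame.midi.Input(port)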
Pygame is initialized and the pygame display is set up: a black background with a red "record" button and a green "play" button. These buttons change to "recording" and "playing" once toggled. The buttons are created with render() and get_rect(). Screenshots of the display are below, followed by a sketch of the button setup. pygame.mixer is then initialized with appropriate parameters, whose values were chosen from widely used defaults.
Pygame screen. Left is when it’s not recording nor playing, middle is when recording, right is when playing.
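The sketch below illustrates how buttons like these can be drawn and hit-tested with render() and get_rect(); the colors, positions, and window size are illustrative, not our exact values.
import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
font = pygame.font.Font(None, 40)

record_surface = font.render("record", True, (255, 0, 0))
record_rect = record_surface.get_rect(center=(80, 120))
play_surface = font.render("play", True, (0, 255, 0))
play_rect = play_surface.get_rect(center=(240, 120))

screen.fill((0, 0, 0))                    # black background
screen.blit(record_surface, record_rect)
screen.blit(play_surface, play_rect)
pygame.display.flip()

for event in pygame.event.get():
    if event.type == pygame.MOUSEBUTTONDOWN:
        if record_rect.collidepoint(event.pos):
            pass                          # toggle recording here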
Everything so far sets up the main loop of the program. The main code sits in a big while loop that runs for as long as the program is running, since it should not end unless it is quit. Within this big while loop there are two inner while loops for the two modes the program can be in: recording and not recording. The main loop was designed this way because the required behavior differs greatly between recording and not recording.
Below is the behavior in non-recording mode. When a mouse click is detected inside the record button, the code prepares a new MIDI file to record notes into, sets "starttime" to the current time, changes the button to "recording", and toggles the conditionals that select the non-recording and recording loops. "starttime" is saved because writing notes to the MIDI file requires each note's start time relative to when the file was created, along with its duration; this variable is used later to calculate those values. When a mouse click is detected inside the play button, there are two cases: recorded audio is not yet playing, or it is already playing. If it was not playing, the "playing" variable is set to true, which later keeps the recorded audio looping, and the play button changes to the brighter "playing" button. If recorded audio was already playing, the music is stopped with pygame.mixer.music.stop(), "playing" is set to false, and the button changes back to "play". In all of these cases, the pygame display is flipped so the buttons are updated on screen.
Still within this non-recording loop, the code checks "playing": if it is true and the mixer is not busy (meaning the last iteration of the recorded file has finished), it loads the recorded audio again and starts playing it, so the recording keeps looping while the play button is on (see the sketch below). There is also an if statement that checks for a new MIDI input event using poll(); poll() reports whether a new event is waiting, so we only read an event when there is one. Note on/off detection was implemented by checking whether the velocity in the MIDI event is zero or nonzero, since typical MIDI controller devices send a nonzero velocity when a note turns on and a zero velocity when it turns off. This worked perfectly with the keyboard piano we tested with; however, we later had to change it, as discussed in the "Testing With Hardware" section. If the velocity is nonzero, we turn on the appropriate note by reading its note value from the MIDI event.
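A sketch of the loop-restart check inside the non-recording loop (variable and file names as described above) might look like this:
import pygame

pygame.mixer.init()
playing = True    # set when the play button is toggled on

# Inside the non-recording loop:
if playing and not pygame.mixer.music.get_busy():
    pygame.mixer.music.load("output.mid")   # reload the recorded file...
    pygame.mixer.music.play()               # ...and start it again, so playback loops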
One problem encountered here was that, on Windows, pygame would not allow pygame.midi and pygame.mixer to play at the same time. This forced us to call MIDIplay.py as a subprocess, spawning a separate instance of pygame to keep playing the recorded file. It also limited the play button, since there is no simple way to share the subprocess's pygame.mixer state to check whether it is busy and relaunch it once the recorded file finishes playing. The functionality was therefore changed so that pressing the play button replays the recording only once. This was later resolved because the RPi environment allows pygame.mixer and pygame.midi to play at the same time, as discussed further in the "Testing With Hardware" section.
Next is the recording loop, entered when the program is recording notes. Its structure is the same as the non-recording loop, except that there are no checks for the play button (playing while recording is not allowed) and there is extra bookkeeping to record the notes being played. poll() is used the same way, except that when a note turns on, the current time minus starttime is stored in the "notes" list, which tracks each note's start time relative to when the MIDI file was created. This information is kept until the note turns off and the note is written to the MIDI file: when a note turns off, its start time is popped from "notes" at the proper index (the note number), the duration is calculated, and the note is written into the MIDI file to be generated after the recording ends (a sketch of this bookkeeping follows). When the record button is pressed again within this loop, a MIDI file "output.mid" is generated with all the recorded MIDI information, the button changes from "recording" back to "record", and the conditionals for the recording and non-recording loops are inverted.
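A sketch of this bookkeeping, where a dict stands in for the "notes" list and the midifile and starttime variables are assumed from the description above (note times are in beats, so with a 60 BPM tempo they equal seconds):
import time

notes = {}   # note number -> start time relative to when recording began

def record_event(note, velocity, starttime, midifile):
    if velocity > 0:                              # note turning on
        notes[note] = time.time() - starttime
    elif note in notes:                           # note turning off
        start = notes.pop(note)
        duration = (time.time() - starttime) - start
        midifile.addNote(0, 0, note, start, duration, 100)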
After these loops finish, we make sure to delete the MIDI output object, which is necessary for the program to exit without an error, and then quit pygame and exit the program.
This program worked well in Mira's Windows environment before testing it with the actual hardware. This version of the code is listed in the code appendix as pi3main1.py.
Testing With Hardware
After the main code was verified to work as expected in the Windows environment, it was tested on the RPi with the rest of the hardware setup. Many changes had to be made because of the environment change.
First, pygame.midi stopped producing sound. On Windows, a default soundfont file is mapped to the MIDI output and produces sound, but on the RPi we had to install and run fluidsynth to map a soundfont file before any sound came out: the program's MIDI output port is connected to fluidsynth's input port, and fluidsynth's output is connected to the aux output. This setup was done with the help of a guide describing how to play MIDI notes on the RPi [11].
Another big change was that in the RPi environment, pygame.midi and pygame.mixer can play sound at the same time in the same program. This was not possible on Windows, where we had to call a subprocess to play the recorded file separately. Once we realized the subprocess was no longer needed, the originally planned code, which simply uses mixer.music.load() and mixer.music.play(), was implemented, and the playback code became much simpler.
Upon testing with our MIDI controller, we quickly realized that it sends nonzero-velocity MIDI messages both when a note turns on and when it turns off (likely because the controller sends explicit note_off messages with velocity 127, as in the offmessages shown earlier, whereas the keyboard piano signalled note-off with a zero velocity). We were not sure how to debug this on the controller side, so we made changes on the synthesizer side instead. When polling in the recording and non-recording loops, instead of checking the velocity value of the MIDI message, we created an array big enough to cover all possible note values and tracked each note's status by toggling the value at that note's index between 1 (on) and 0 (off), as sketched below.
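A minimal sketch of that note-state toggle (128 entries cover the full MIDI note range):
notestate = [0] * 128    # 1 if the note is currently on, 0 if off

def note_turned_on(note):
    # Every MIDI event for a note flips its stored state; the first event turns
    # the note on, the next one turns it off, regardless of the velocity value.
    notestate[note] ^= 1
    return notestate[note] == 1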
One change we planned to make when merging with the hardware, but could not, was showing the pygame screen on the piTFT instead of the RPi desktop. We tried setting it up the same way we did in all the other labs, but it kept giving an error saying the video system is not initialized. We tried everything suggested on the internet but were unable to get it working, and ended up keeping the pygame display on the RPi desktop.
After making all these changes, the program worked as expected except for the piTFT screen. We pressed buttons to confirm they produce sound, recorded a sequence, and played it back while pressing more buttons, verifying that everything worked properly. This version of the code is listed in the code appendix as pi3main2.py. One thing we noticed when recording notes to a MIDI file is that the files are very small (a few hundred kilobytes for around 10 seconds of recording), which means this system can record very long songs very efficiently.
Generalized overall code structure. Note that the only change between pi3main1.py and pi3main2.py is the "if playing" part.
Results
Overall, everything performed as planned and we were able to meet the goals outlined in the objective. The demo went well, with everything we had planned working properly.
Conclusions
For the MIDI controller, the system worked much better than expected. We had expected that polling for inputs would cause too much latency when reading, but it worked fine.
For the MIDI synthesizer, we met all the goals we set for this project except putting the pygame display on the piTFT screen. We tried to debug this issue with every solution we could find online, but could not solve it: setting it up the same way we had in all our labs, which used to work, kept producing a "video not initialized" error. Instead, we kept the pygame display on the desktop, which works properly. Aside from this issue, the MIDI synthesizer was able to play sounds from MIDI input signals, record a sequence of notes, and replay that recording in a loop while simultaneously playing sound for new MIDI input.
Future Work
Future plans for the controller include building a proper housing, using more comfortable buttons, and adding functions such as a rotary encoder and potentiometers for adjusting velocity and channels.
Future work on the synthesizer mostly involves adding more functions that modify the sound produced from the MIDI input signals. We could load different soundfont files, or assign drum-beat wav files to each note so the board plays beats instead of musical notes. Many functions can be added or modified at no additional cost, which is the beauty of our RPi-based system compared to the synthesizers on the market.
Budget
Part Name | Count | Price ($)
Raspberry Pi 3 | 1 |
Raspberry Pi Zero W | 1 |
Red 5mm LEDs | 64 |
Switches | 70 | 16.18
Protoboard | 1 | 12.50
10k Resistors | 64 |
Parts without a listed price were sourced from our lab.
References
[1] J. Mae, “DIY 64-Step MIDI Sequencer,” Adafruit Industries - Makers, hackers, artists, designers and engineers!, Jan. 11, 2017. https://blog.adafruit.com/2017/01/11/diy-64-step-midi-sequencer/ (accessed May 18, 2020).
[2] MRecord, “Launchpad / Sequencer With MIDI Output,” Instructables. https://www.instructables.com/id/Launchpad-Sequencer-with-MIDI-output/ (accessed May 18, 2020).
[3] “DIY MIDI step sequencer [Sonniboy mk_4] 3 firmwares description | SonnikimEngineering on Patreon,” Patreon. https://www.patreon.com/posts/diy-midi-step-mk-18099699 (accessed May 18, 2020).
[4] -philicity-, “Turn Your Arduino Uno Into an USB-HID-Mididevice,” Instructables. https://www.instructables.com/id/Turn-your-Arduino-Uno-into-an-USB-HID-Mididevice/ (accessed May 18, 2020).
[5] IxDLAB, “Setting-Up-Raspberry-Pi-for-MIDI.pdf,” IxD Lab at the IT University of Copenhagen. https://ixdlab.itu.dk/wp-content/uploads/sites/17/2017/10/Setting-Up-Raspberry-Pi-for-MIDI.pdf?fbclid=IwAR0G5VT1DlzA_Bcm-5yMhuBh8MkP53IkFheR9lhtrlQoAWskgM-FxAO-6zw (accessed May 18, 2020).
[7] “kushalbhabra/pyMidi,” GitHub. https://github.com/kushalbhabra/pyMidi (accessed May 19, 2020).
[9] “Adafruit I2S Stereo Decoder - UDA1334A,” Adafruit Learning System. https://learn.adafruit.com/adafruit-i2s-stereo-decoder-uda1334a/audio-with-pygame (accessed May 19, 2020).
[11] R. Reinold, “How to build a digital piano using a Raspberry Pi and MIDI Piano controller,” Medium, May 08, 2020. https://medium.com/@rreinold/how-to-use-a-raspberry-pi-3-to-turn-midi-piano-to-into-stand-alone-powered-piano-4aeb79e309ce (accessed May 19, 2020).
Code Appendix
For the MIDI controller code (midimatrix.py):
https://github.com/nahtonaj/MIDIcontroller/blob/master/midimatrix.py#L26
For MidiIntest.py, MIDIplay.py, miditest.py, pi3main1.py, pi3main2.py, and an example of output.mid:
https://github.com/mk864mk864/ece5725
Team and Work Distribution
Mira Kim (mk864)
Responsible for overall software, especially the MIDI Synthesizer part.
Jonathan Gao (jg992)
Responsible for overall hardware, and software for hardware integration, especially the MIDI Controller part.